feat(show): add serialization-tree via DeepEval.serialize #5316

njzjz-bot wants to merge 17 commits into deepmodeling:master
Conversation
Authored by OpenClaw (model: gpt-5.2)
Implements #5185 (first step): add `serialization-tree` powered by a backend-unified wrapper.

Authored by OpenClaw (model: gpt-5.2)

(FYI) Previous comment got mangled by shell backticks; this one is the corrected text.

Authored by OpenClaw (model: gpt-5.2)
for more information, see https://pre-commit.ci
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. Use the following commands to manage reviews:
📝 Walkthrough

Adds a "serialization-tree" CLI attribute and implements `DeepEval.serialize()` across backends.

Changes
Sequence Diagram

```mermaid
sequenceDiagram
    participant CLI as CLI Parser
    participant Show as Show Entrypoint
    participant DeepEval as DeepEval Interface
    participant Backend as Backend Implementation
    participant Node as Node (Serialization)
    CLI->>Show: invoke show with ATTRIBUTES includes "serialization-tree"
    Show->>DeepEval: call serialize()
    DeepEval->>Backend: resolve backend (may use model_file) and call serialize()
    Backend-->>DeepEval: return serialized dict (model payload)
    Show->>Node: Node.deserialize(serialized_payload)
    Node-->>Show: serialization tree string
    Show->>Show: log serialization tree
```
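The call flow in the diagram can be sketched in Python. Everything below is a stand-in invented for illustration: `FakeBackend`, `FakeDeepEval`, and `render_tree` only mimic the roles of the real backend classes and `Node.deserialize(...)` from the PR; they are not the deepmd-kit API.

```python
# Hypothetical sketch of the "serialization-tree" flow shown in the diagram.
# All class and function names here are stand-ins, not the real deepmd classes.
from typing import Any


class FakeBackend:
    """Stand-in for a backend implementation with a serialize() method."""

    def serialize(self) -> dict[str, Any]:
        # A backend returns the serialized model payload as a plain dict.
        return {"descriptor": {"type": "se_e2_a"}, "fitting": {"type": "ener"}}


class FakeDeepEval:
    """Stand-in for the DeepEval interface that delegates to the backend."""

    def __init__(self, backend: FakeBackend) -> None:
        self.backend = backend

    def serialize(self) -> dict[str, Any]:
        return self.backend.serialize()


def render_tree(data: dict[str, Any], indent: int = 0) -> str:
    """Stand-in for the tree rendering done via Node: nested dict keys as a tree."""
    lines = []
    for key, value in data.items():
        lines.append(" " * indent + key)
        if isinstance(value, dict):
            lines.append(render_tree(value, indent + 2))
    return "\n".join(lines)


def show_serialization_tree(deep_eval: FakeDeepEval) -> str:
    # Mirrors the sequence: Show -> DeepEval.serialize() -> tree -> log.
    serialized = deep_eval.serialize()
    return render_tree(serialized)


tree = show_serialization_tree(FakeDeepEval(FakeBackend()))
```

The point of the sketch is the delegation direction: the entrypoint never touches backend-specific state, it only consumes the dict returned by `serialize()`.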
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested reviewers
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (warning)
Actionable comments posted: 1
🧹 Nitpick comments (1)

deepmd/infer/deep_eval.py (1)

444-446: Cyclic import flagged by static analysis. CodeQL reports a cyclic import starting from deepmd.pretrained.deep_eval. While placing the import inside the method body mitigates runtime issues (the import only occurs when the pretrained branch is taken), consider whether these utilities could be imported from a lower-level module to avoid the cycle entirely.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@deepmd/infer/deep_eval.py` around lines 444 - 446, The import of parse_pretrained_alias inside deepmd.infer.deep_eval creates a cyclic dependency with deepmd.pretrained.deep_eval; to fix it, extract parse_pretrained_alias (and any small helper utilities it needs) into a lower-level module (e.g., deepmd.pretrained.utils or deepmd.utils.pretrained) and update both deepmd.pretrained.deep_eval and deepmd.infer.deep_eval to import parse_pretrained_alias from that new module; ensure the extracted function has no imports back to deepmd.pretrained.deep_eval to break the cycle and run tests to confirm no runtime regressions.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/infer/deep_eval.py`:
- Around line 443-454: When resolving a pretrained alias you re-detect the
backend and immediately call serialize_hook on backend_cls (via
Backend.detect_backend_by_model and backend_cls().serialize_hook) without
checking that the backend supports the IO feature; add the same IO capability
check used in the regular path before calling serialize_hook: after determining
backend_cls = Backend.detect_backend_by_model(resolved), verify
Backend.feature(backend_cls, Backend.Feature.IO) (or equivalent feature-check
method used elsewhere) and raise the same NotImplementedError with the same
message if IO is not supported, otherwise call
backend_cls().serialize_hook(resolved).
---
Nitpick comments:
In `@deepmd/infer/deep_eval.py`:
- Around line 444-446: The import of parse_pretrained_alias inside
deepmd.infer.deep_eval creates a cyclic dependency with
deepmd.pretrained.deep_eval; to fix it, extract parse_pretrained_alias (and any
small helper utilities it needs) into a lower-level module (e.g.,
deepmd.pretrained.utils or deepmd.utils.pretrained) and update both
deepmd.pretrained.deep_eval and deepmd.infer.deep_eval to import
parse_pretrained_alias from that new module; ensure the extracted function has
no imports back to deepmd.pretrained.deep_eval to break the cycle and run tests
to confirm no runtime regressions.
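The extraction the review suggests can be sketched as follows. This is a hypothetical illustration: the module placement (`deepmd.utils.pretrained`) is only one of the options named in the review, and the alias format parsed here is invented for the example — the real `parse_pretrained_alias` may look quite different.

```python
# Hypothetical sketch of breaking the cycle: move parse_pretrained_alias into a
# low-level utility module so neither deepmd.infer.deep_eval nor
# deepmd.pretrained.deep_eval needs to import the other.

# --- contents of the proposed low-level module (no imports back into deepmd.pretrained) ---
def parse_pretrained_alias(alias: str) -> str:
    """Resolve a pretrained alias such as 'name@version' to a concrete name.

    The '@'-separated format here is invented for illustration only.
    """
    name, _, version = alias.partition("@")
    return f"{name}-{version or 'latest'}"


# --- both higher-level modules would then import from it, breaking the cycle ---
resolved = parse_pretrained_alias("dpa3@v1")
```

Because the helper has no upward imports, both `deepmd.infer.deep_eval` and `deepmd.pretrained.deep_eval` can import it at module top level without reintroducing the cycle.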
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: a8a6ae18-b1d4-465f-b5b3-2149b4fbd411
📒 Files selected for processing (3)

- deepmd/entrypoints/show.py
- deepmd/infer/deep_eval.py
- deepmd/main.py
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           master    #5316      +/-   ##
==========================================
- Coverage   82.29%   79.74%    -2.56%
==========================================
  Files         775      814       +39
  Lines       77627    84485     +6858
  Branches     3676     4050      +374
==========================================
+ Hits        63887    67370     +3483
- Misses      12566    15891     +3325
- Partials     1174     1224       +50
```

☔ View full report in Codecov by Sentry.
Switch DeepEval.serialize() to delegate to DeepEvalBackend.serialize(), and implement serialize() in each backend by calling the underlying model's serialize(). Also move Node import in dp show to module top-level. Authored by OpenClaw (model: gpt-5.2)
Refactor update (per feedback): switch from

Latest commit: 1694360

Authored by OpenClaw (model: gpt-5.2)
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/pt_expt/infer/deep_eval.py`:
- Around line 668-673: serialize() currently returns the raw model_dict from
serialize_from_file(self.model_path) but callers (e.g., the CLI in show.py)
expect a wrapper with keys "backend", "model", "model_def_script", and
"@variables"; change serialize() in the class to call
serialize_from_file(self.model_path), then construct and return a dict like
{"backend": "<appropriate backend name or value from model_dict if available>",
"model": model_dict, "model_def_script": <script or None>, "@variables": <vars
dict or empty dict>} so the returned structure contains those keys; ensure you
pull any available values from model_dict to populate "backend",
"model_def_script", and "@variables" or set sensible defaults if missing.
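The wrapper shape the reviewer describes can be sketched as a small helper. This is a hypothetical illustration of the requested envelope, not the actual pt_expt implementation; the fallback backend name and the fake payload are invented, while the four key names come from the review comment.

```python
# Hypothetical sketch: wrap a raw model_dict in the envelope that the callers
# of serialize() (e.g. the CLI in show.py) expect. Key names follow the review
# comment; the "pt_expt" fallback and the sample payload are invented here.
from typing import Any


def wrap_serialized_model(model_dict: dict[str, Any]) -> dict[str, Any]:
    return {
        # pull values from model_dict when available, else sensible defaults
        "backend": model_dict.get("backend", "pt_expt"),
        "model": model_dict,
        "model_def_script": model_dict.get("model_def_script"),
        "@variables": model_dict.get("@variables", {}),
    }


payload = wrap_serialized_model({"descriptor": {}, "fitting": {}})
```

The key point is that the raw dict becomes the `"model"` value rather than being returned directly, so all consumers see the same four-key structure regardless of backend.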
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: f7533fe2-e54e-4e12-ae64-e1712fa5c00b
📒 Files selected for processing (9)

- deepmd/dpmodel/infer/deep_eval.py
- deepmd/entrypoints/show.py
- deepmd/infer/deep_eval.py
- deepmd/jax/infer/deep_eval.py
- deepmd/pd/infer/deep_eval.py
- deepmd/pretrained/deep_eval.py
- deepmd/pt/infer/deep_eval.py
- deepmd/pt_expt/infer/deep_eval.py
- deepmd/tf/infer/deep_eval.py
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/entrypoints/show.py
Pull request overview
This PR adds a new serialization-tree option to the dp show command by introducing a backend-unified DeepEval.serialize() API and implementing it across supported inference backends.
Changes:
- Added `DeepEvalBackend.serialize()` (and wrapper `DeepEval.serialize()`) to return a unified serialized-model dictionary.
- Implemented backend-specific `serialize()` methods for TF / PyTorch / Paddle / DPModel / JAX / pt_expt (+ pretrained delegator).
- Extended the `dp show` CLI attribute choices and entrypoint logic to print a model serialization tree using `Node.deserialize(...)`.
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| deepmd/infer/deep_eval.py | Adds serialize() to the low-level backend interface and exposes it on the high-level DeepEval wrapper. |
| deepmd/tf/infer/deep_eval.py | Implements TF serialization via graph_def loading + Model.serialize(), including optional min_nbor_dist. |
| deepmd/pt/infer/deep_eval.py | Implements PyTorch serialization via self.dp.model["Default"].serialize(). |
| deepmd/pd/infer/deep_eval.py | Implements Paddle serialization via self.dp.model["Default"].serialize(). |
| deepmd/dpmodel/infer/deep_eval.py | Implements DPModel serialization via self.dp.serialize(). |
| deepmd/jax/infer/deep_eval.py | Adds JAX serialization method and includes jax_version. |
| deepmd/pt_expt/infer/deep_eval.py | Implements serialization by delegating to serialize_from_file(...). |
| deepmd/pretrained/deep_eval.py | Delegates serialization to the resolved backend for pretrained aliases. |
| deepmd/entrypoints/show.py | Adds serialization-tree printing using Node.deserialize(data["model"]). |
| deepmd/main.py | Adds serialization-tree to the dp show CLI ATTRIBUTES choices list. |
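The per-backend pattern summarized in the table can be sketched with stand-ins. The delegation shape (`self.dp.model["Default"].serialize()` for PyTorch/Paddle) comes from the table; the classes below are invented to make the pattern runnable and are not the real deepmd classes.

```python
# Hypothetical sketch of the PyTorch/Paddle delegation pattern from the table:
# the DeepEval holds a wrapper whose .model dict maps "Default" to the model,
# and serialize() simply forwards to that model. All classes are stand-ins.
from typing import Any


class FakeModel:
    def serialize(self) -> dict[str, Any]:
        return {"type": "ener", "@version": 1}


class FakeWrapper:
    """Mimics the pt/pd ModelWrapper holding a dict of named models."""

    def __init__(self) -> None:
        self.model = {"Default": FakeModel()}


class FakeDeepEvalPT:
    def __init__(self) -> None:
        self.dp = FakeWrapper()

    def serialize(self) -> dict[str, Any]:
        # PyTorch/Paddle pattern: delegate to the "Default" model.
        return self.dp.model["Default"].serialize()


serialized = FakeDeepEvalPT().serialize()
```

DPModel, by contrast, has no wrapper layer, which is why the table shows it calling `self.dp.serialize()` directly.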
Route JAX DeepEval serialization through the existing file-based serializer so .hlo and .savedmodel models follow the supported path instead of calling unimplemented model-level serialize() methods. Also add the missing pt_version field to the PyTorch backend serializer and wrap pt_expt serialization in the backend-unified payload expected by dp show serialization-tree. Add a targeted pt_expt serialization contract test. Authored by OpenClaw (model: gpt-5.4)
for more information, see https://pre-commit.ci
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/jax/infer/deep_eval.py`:
- Around line 190-195: Update the serialize method in deep_eval.DeepEval (the
serialize(self) -> dict[str, Any] function that currently calls
serialize_from_file(self.model_path)) to explicitly document and guard against
TensorFlow SavedModel inputs: add a docstring note explaining that JAX backend
only supports serializing .jax/.hlo directories and that .savedmodel
(TensorFlow-wrapped) models are not supported, and add a pre-check that inspects
self.model_path (and/or the model wrapper type, e.g., TFModelWrapper if
accessible) to raise a clear ValueError with a descriptive message like
"serialize() not supported for .savedmodel / TFModelWrapper: JAX backend only
supports converting .jax/.hlo directories" before calling serialize_from_file.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 5017e4a9-2d01-483b-a3ea-85d28ab8a405
📒 Files selected for processing (4)

- deepmd/jax/infer/deep_eval.py
- deepmd/pt/infer/deep_eval.py
- deepmd/pt_expt/infer/deep_eval.py
- source/tests/pt_expt/infer/test_deep_eval.py
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/pt_expt/infer/deep_eval.py
Route dp show serialization-tree through backend file serialization hooks and pass only the serialized model payload into Node.deserialize(). This keeps Node focused on model structure instead of backend-specific envelope fields. Add a focused unit test that locks the behavior at the entrypoint layer. Authored by OpenClaw (model: gpt-5.4)
Make DeepEval.serialize() return only the model serialization tree instead of a backend metadata envelope. This lets callers such as dp show serialization-tree consume model structure directly without caring about backend-specific data wrappers. Keep full backend data serialization in backend serialize hooks for IO and backend conversion paths. Authored by OpenClaw (model: gpt-5.4)
for more information, see https://pre-commit.ci
Actionable comments posted: 1
♻️ Duplicate comments (1)
deepmd/jax/infer/deep_eval.py (1)
190-198: ⚠️ Potential issue | 🟠 Major

`serialize()` is incompatible with `.savedmodel` inputs currently accepted by this backend. Line 195 calls `serialize_from_file(self.model_path)`, but `.savedmodel` paths (accepted in `__init__`) are not supported by that serializer, so `dp show ... serialization-tree` will fail for those models.

Suggested guard for a clearer failure mode:

```diff
 def serialize(self) -> dict[str, Any]:
     from deepmd.jax.utils.serialization import (
         serialize_from_file,
     )
+    if str(self.model_path).endswith(".savedmodel"):
+        raise NotImplementedError(
+            "serialize() is not supported for .savedmodel in JAX backend; "
+            "only JAX-native serialized models are supported."
+        )
     data = serialize_from_file(self.model_path)
     if "model" not in data:
         raise RuntimeError("Serialized model data does not contain key 'model'.")
     return data["model"]
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@deepmd/jax/infer/deep_eval.py` around lines 190 - 198, The serialize() method calls serialize_from_file(self.model_path) which doesn't support TensorFlow .savedmodel directories accepted by the class __init__; update serialize() to detect .savedmodel inputs (e.g. check if self.model_path endswith or is a directory with saved_model.pb) and either call a dedicated serializer for SavedModel paths or raise a clear RuntimeError indicating .savedmodel is unsupported by serialize_from_file, mentioning self.model_path and serialize_from_file in the message so dp show ... serialization-tree fails with a clear, actionable error instead of an internal exception.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/pd/infer/deep_eval.py`:
- Around line 733-735: The serialize() implementation assumes
self.dp.model["Default"] but fails when self.dp is already a static model
object; update serialize() to handle both shapes: if self.dp is a static model
(e.g., has a serialize method or lacks a .model dict) call and return
self.dp.serialize(); otherwise, keep the existing behavior of retrieving model =
self.dp.model["Default"] and returning model.serialize(); use hasattr checks (or
isinstance if a concrete class exists) to detect which branch to take and avoid
indexing into .model when it's not a mapping.
---
Duplicate comments:
In `@deepmd/jax/infer/deep_eval.py`:
- Around line 190-198: The serialize() method calls
serialize_from_file(self.model_path) which doesn't support TensorFlow
.savedmodel directories accepted by the class __init__; update serialize() to
detect .savedmodel inputs (e.g. check if self.model_path endswith or is a
directory with saved_model.pb) and either call a dedicated serializer for
SavedModel paths or raise a clear RuntimeError indicating .savedmodel is
unsupported by serialize_from_file, mentioning self.model_path and
serialize_from_file in the message so dp show ... serialization-tree fails with
a clear, actionable error instead of an internal exception.
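The branch detection the Paddle comment asks for can be sketched as follows. This is a hypothetical illustration of the `hasattr`-based fallback described in the review, with invented stand-in classes; the real Paddle wrapper and static-model types will differ.

```python
# Hypothetical sketch of the Paddle fallback from the review: index
# self.dp.model["Default"] when dp is a wrapper holding a model mapping, but
# call dp.serialize() directly when dp is already a static model object.
from typing import Any


class StaticModel:
    """Stand-in for a Paddle static model that serializes itself."""

    def serialize(self) -> dict[str, Any]:
        return {"kind": "static"}


class Wrapper:
    """Stand-in for a wrapper exposing a dict of named models."""

    def __init__(self) -> None:
        self.model = {"Default": StaticModel()}


def serialize_dp(dp: Any) -> dict[str, Any]:
    # hasattr/isinstance check suggested by the review: only index .model
    # when it exists and is a mapping; otherwise assume dp serializes itself.
    if hasattr(dp, "model") and isinstance(dp.model, dict):
        return dp.model["Default"].serialize()
    return dp.serialize()


from_wrapper = serialize_dp(Wrapper())
from_static = serialize_dp(StaticModel())
```

Both shapes reach the same serialized payload, which is exactly the property the fix needs: callers of `serialize()` never learn which branch was taken.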
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 22858c7f-cc6a-436c-bb71-bece5d4f54f5
📒 Files selected for processing (10)

- deepmd/dpmodel/infer/deep_eval.py
- deepmd/entrypoints/show.py
- deepmd/infer/deep_eval.py
- deepmd/jax/infer/deep_eval.py
- deepmd/pd/infer/deep_eval.py
- deepmd/pt/infer/deep_eval.py
- deepmd/pt_expt/infer/deep_eval.py
- deepmd/tf/infer/deep_eval.py
- source/tests/pt_expt/infer/test_deep_eval.py
- source/tests/test_entrypoint_show_serialization_tree.py
✅ Files skipped from review due to trivial changes (1)
- source/tests/pt_expt/infer/test_deep_eval.py
🚧 Files skipped from review as they are similar to previous changes (4)
- deepmd/pt_expt/infer/deep_eval.py
- deepmd/infer/deep_eval.py
- deepmd/dpmodel/infer/deep_eval.py
- deepmd/entrypoints/show.py
Assert in the consistent IO deep-eval test that DeepEval.serialize() returns the same model tree as the backend serialize hook payload. This keeps the new serialize semantics covered in the shared cross-backend IO test. Authored by OpenClaw (model: gpt-5.4)
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@source/tests/consistent/io/test_io.py`:
- Line 164: The test currently uses self.assertEqual(deep_eval.serialize(),
serialized_data["model"]) which can fail on numpy.ndarray leaves; replace this
with a NumPy-aware comparison using
numpy.testing.assert_equal(deep_eval.serialize(), serialized_data["model"]) (or
np.testing.assert_equal if numpy is imported as np) to correctly handle array
comparisons; ensure numpy is imported in the test module (e.g., import numpy as
np) if not already and update the assertion in the test that calls
deep_eval.serialize() and compares to serialized_data["model"] accordingly.
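The reason for the suggested switch to `np.testing.assert_equal` can be demonstrated in a few lines. The payload dicts below are invented; the behavior shown (plain `==` on dicts containing multi-element arrays raising `ValueError`, while `np.testing.assert_equal` recurses into containers) is standard NumPy/Python semantics.

```python
# Why np.testing.assert_equal instead of assertEqual: comparing dicts whose
# leaves are numpy arrays triggers elementwise ==, whose truth value is
# ambiguous for multi-element arrays. Sample payloads are invented here.
import numpy as np

a = {"model": {"weights": np.array([1.0, 2.0])}}
b = {"model": {"weights": np.array([1.0, 2.0])}}

try:
    a == b  # dict equality coerces the elementwise array result to bool
    plain_eq_raised = False
except ValueError:
    plain_eq_raised = True

# assert_equal recurses into dicts and compares array leaves element by element.
np.testing.assert_equal(a, b)
```

This is why the suggested test assertion is `np.testing.assert_equal(deep_eval.serialize(), serialized_data["model"])` rather than `self.assertEqual(...)`.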
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 3b9c2d6d-4aa2-4d0d-bc03-3d6b78b1d5cd
📒 Files selected for processing (1)
source/tests/consistent/io/test_io.py
Handle Paddle static models when serializing through DeepEval by falling back to self.dp when it is not a ModelWrapper. Also use numpy-aware equality in the consistent IO serialize check. Authored by OpenClaw (model: gpt-5.4)
for more information, see https://pre-commit.ci
Summary by CodeRabbit
New Features
Tests